Improving the Efficiency of Gibbs Sampling for Probabilistic Logical Models by Means of Program Specialization
Abstract
There is currently considerable interest in probabilistic logical models. A popular algorithm for approximate probabilistic inference with such models is Gibbs sampling. From a computational perspective, Gibbs sampling boils down to repeatedly executing certain queries on a knowledge base composed of a static part and a dynamic part. The larger the static part, the more redundancy there is in these repeated calls. This is problematic because an inefficient Gibbs sampler draws fewer samples in a given time budget and hence yields poorer approximations. We show how to apply program specialization to make Gibbs sampling more efficient. Concretely, we develop an algorithm that specializes the definitions of the query predicates with respect to the static part of the knowledge base. In experiments on real-world benchmarks we obtain speedups of up to an order of magnitude.
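To make the redundancy concrete, the sketch below (our own illustration, not the paper's system) contrasts a naive query that re-derives a static reachability relation on every Gibbs step with a version specialized with respect to the static part, which performs that derivation once up front. The toy knowledge base, the reachability query, and all identifiers are hypothetical.

```python
import random

# Hypothetical toy setting: the query succeeds if the sampled dynamic
# value is reachable from a fixed constant through static edges.
STATIC_EDGES = {("a", "b"), ("b", "c"), ("c", "d")}  # static part of the KB

def reachable_from(x):
    # Derive the set of nodes reachable from x via the static edges.
    seen, frontier = {x}, [x]
    while frontier:
        u = frontier.pop()
        for (s, t) in STATIC_EDGES:
            if s == u and t not in seen:
                seen.add(t)
                frontier.append(t)
    return seen

def query_naive(x, dynamic_value):
    # Redundant: recomputes the static derivation on every call.
    return dynamic_value in reachable_from(x)

def specialize(x):
    # "Program specialization" in miniature: evaluate the static part
    # once; the residual query only inspects the dynamic part.
    seen = reachable_from(x)
    return lambda dynamic_value: dynamic_value in seen

def gibbs_chain(query, n_steps=10_000):
    # Caricature of a Gibbs sampler: resample the dynamic fact and
    # re-run the query, step after step.
    hits = 0
    for _ in range(n_steps):
        hits += query(random.choice(["a", "b", "c", "d", "e"]))
    return hits / n_steps

print(gibbs_chain(lambda v: query_naive("a", v)))  # static work per step
print(gibbs_chain(specialize("a")))                # static work done once
```

Both chains compute the same estimate, but the specialized one pays the cost of the static derivation once rather than once per sampling step, which is the effect the paper's specialization algorithm targets.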
Similar papers
Improving the Efficiency of Approximate Inference for Probabilistic Logical Models by means of Program Specialization
We consider the task of performing probabilistic inference with probabilistic logical models. Many algorithms for approximate inference with such models are based on sampling. From a logic programming perspective, sampling boils down to repeatedly calling the same queries on a knowledge base composed of a static and a dynamic part. The larger the static part, the more redundancy there is in the...
Fast Inference for Interactive Models of Text
Probabilistic models are a useful means for analyzing large text corpora. Integrating such models with human interaction enables many new use cases. However, adding human interaction to probabilistic models requires inference algorithms which are both fast and accurate. We explore the use of Iterated Conditional Modes as a fast alternative to Gibbs sampling or variational EM. We demonstrate sup...
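As a rough illustration of why Iterated Conditional Modes (ICM) can be faster than Gibbs sampling, the following toy sketch (our assumption, unrelated to the paper's text models) updates a chain of binary variables either by sampling each full conditional or by taking its mode; the agreement potential and the COUPLING constant are invented for illustration.

```python
import random

COUPLING = 0.9  # invented weight favouring agreement with neighbours

def conditional(x, i):
    # Unnormalised conditional weights for x[i] in {0, 1} given its
    # neighbours in a simple chain-structured pairwise model.
    w = [1.0, 1.0]
    for j in (i - 1, i + 1):
        if 0 <= j < len(x):
            for v in (0, 1):
                w[v] *= COUPLING if v == x[j] else (1 - COUPLING)
    return w

def gibbs_step(x):
    for i in range(len(x)):
        w = conditional(x, i)
        x[i] = int(random.random() < w[1] / (w[0] + w[1]))  # sample

def icm_step(x):
    for i in range(len(x)):
        w = conditional(x, i)
        x[i] = int(w[1] > w[0])  # take the conditional mode instead

x = [random.randint(0, 1) for _ in range(10)]
for _ in range(20):
    icm_step(x)  # deterministic updates converge quickly to a local mode
print(x)
```

ICM replaces each random draw with an argmax, trading the asymptotic guarantees of Gibbs sampling for speed and determinism.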
Sampling from Probabilistic Submodular Models
Submodular and supermodular functions have found wide applicability in machine learning, capturing notions such as diversity and regularity, respectively. These notions have deep consequences for optimization, and the problem of (approximately) optimizing submodular functions has received much attention. However, beyond optimization, these notions allow specifying expressive probabilistic model...
Approximating Bayes Estimates by Means of the Tierney Kadane, Importance Sampling and Metropolis-Hastings within Gibbs Methods in the Poisson-Exponential Distribution: A Comparative Study
Here, we work on the problem of point estimation of the parameters of the Poisson-exponential distribution through the Bayesian and maximum likelihood methods based on complete samples. The point Bayes estimates under the symmetric squared error loss (SEL) function are approximated using three methods, namely the Tierney Kadane approximation method, the importance sampling method and the Metrop...
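For readers unfamiliar with the Metropolis-Hastings-within-Gibbs scheme mentioned above, here is a generic two-parameter sketch; the log_post function is a hypothetical stand-in, not the Poisson-exponential posterior studied in the paper.

```python
import math, random

def log_post(theta, lam):
    # Placeholder log-posterior, invented for illustration only.
    if theta <= 0 or lam <= 0:
        return -math.inf
    return -theta - lam + 5 * math.log(theta * lam) - theta * lam

def mh_within_gibbs(n_iter=5000, step=0.5):
    # Update each parameter in turn with a random-walk Metropolis step,
    # conditioning on the current value of the other parameter.
    theta, lam = 1.0, 1.0
    draws = []
    for _ in range(n_iter):
        for which in (0, 1):
            prop = [theta, lam]
            prop[which] += random.gauss(0, step)  # random-walk proposal
            log_ratio = log_post(*prop) - log_post(theta, lam)
            if random.random() < math.exp(min(0.0, log_ratio)):
                theta, lam = prop
        draws.append((theta, lam))
    return draws

draws = mh_within_gibbs()
print(sum(t for t, _ in draws) / len(draws))  # crude posterior mean of theta
```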
Context-Specific Independence in Directed Relational Probabilistic Models and its Influence on the Efficiency of Gibbs Sampling
There is currently a large interest in relational probabilistic models. While the concept of context-specific independence (CSI) has been well-studied for models such as Bayesian networks, this is not the case for relational probabilistic models. In this paper we show that directed relational probabilistic models often exhibit CSI by identifying three different sources of CSI in such models (tw...